The Journal of Heart and Lung Transplantation
Elsevier BV
All preprints, ranked by how well they match The Journal of Heart and Lung Transplantation's content profile, based on 10 papers previously published here. The average preprint has a 0.01% match score for this journal, so anything above that is already an above-average fit. Older preprints may already have been published elsewhere.
Schmauch, E.; Piening, B. D.; Xia, B.; Zhu, C.; Stern, J.; Zhang, W.; Dowdell, A.; Loza, B.; Mohebnasab, M.; Gragert, L.; Khalil, K.; Camellato, B.; de Oliveira, M. F.; O'Brien, D.; Weldon, E.; Lin, X.; Gao, H.; Kagermazova, L.; Kim, J.; Loupy, A.; Heguy, A.; Taylor, S.; Zhu, F.; Gao, S.; Gandla, D.; Reddy, K.; Chang, A.; Michael, B.; Jiang, L.; Jian, R.; Narula, N.; Linna-Kuosmanen, S.; Kaikkonen-Maatta, M.; Lorber, M.; Kellis, M.; Tatapudi, V.; Ayares, D.; Griesemer, A.; Mangiola, M.; Pass, H.; Snyder, M. P.; Boeke, J. D.; Montgomery, R. A.; Keating, B. J.
Background: Recent advances in xenotransplantation into living and decedent humans using pig xenografts have laid promising groundwork for future emergency use and first-in-human trials. Major obstacles remain, however, including limited knowledge of the genetic incompatibilities between pig donors and human recipients, which may lead to harmful immune responses against the xenograft or dysregulation of normal physiology. In 2022, two pig heart xenografts were transplanted into two brain-dead human decedents under a minimized immunosuppression regimen, primarily to evaluate the onset of hyperacute antibody-mediated rejection and sustained xenograft function over 3 days.
Methods: We performed multi-omic profiling to assess the dynamic interactions between the pig and human genomes in the first two pig heart xenograft transplants into human decedents. To assess global and specific biological changes that may correlate with immune-related outcomes and xenograft function, we generated transcriptomic, lipidomic, proteomic, and metabolomic datasets across blood and tissue samples collected every 6 hours over the 3-day procedures.
Results: Single-cell datasets in the 3-day pig xenograft-decedent models show dynamic immune-activation processes. We observe specific scRNA-seq, snRNA-seq, and spatial transcriptomic changes of early immune activation leading to pronounced downstream T-cell activity and hallmarks of early antibody-mediated rejection (AbMR) and/or ischemia-reperfusion injury (IRI) in the first xenograft recipient. Using longitudinal multi-omic integrative analyses of blood, together with antigen-presentation pathway enrichment, we also observe in the first xeno-heart recipient significant changes in cellular metabolism and liver-damage pathways that correlate with profound physiological dysfunction, whereas these signals are not present in the other xenograft recipient.
Conclusions: Single-cell and multi-omics approaches reveal fundamental insights into early molecular immune responses indicative of IRI and/or early AbMR in the first human decedent, which were not evident in conventional histological evaluations.
Baker, C. E.; Stead, T. S.; Shah, A. R.; Pullmann, D.; Chinta, S.; Tran, D. L.; Brydges, H. T.; Laspro, M.; Gelb, B. E.; Rodriguez, E. D.; Rabbani, P. S.
PURPOSE: The various physiological profiles comprising vascularized composite allografts (VCAs) pose unique challenges to preservation. Minimizing ischemia, reperfusion injury, and rejection remains a primary focus of graft pre-treatments (PTs). Currently, the gold-standard PT consists of flushing the graft and placing it in static cold storage in University of Wisconsin solution. With this method, graft viability is limited to four to six hours. Prolonging this limit would increase the donor allocation radius, access to care, and positive patient outcomes. We aimed to evaluate novel PTs that could potentially enhance and lengthen VCA viability.
METHODS: Following PRISMA guidelines, we conducted a comprehensive literature search of Embase, Cochrane, and PubMed. Studies had to be published prior to June 15, 2022. PTs had to target cell physiology rather than immunogenicity. We extracted data including study design, PT details, evaluation metrics, and outcomes.
RESULTS: We identified thirteen studies, categorized into three groups: solution-based alterations to the gold standard, ex vivo perfusion, and other novel techniques. The incorporation of hydrogen sulfide and Perfadex as solutions in the gold-standard protocol demonstrated a six-day delay in rejection and limited reperfusion-injury markers, respectively. In one ex vivo perfusion study, after 24 hours of PT and 12 hours post-transplant, VCA muscle contractility remained close to normal; the gold-standard PT did not demonstrate the same success. However, graft weight gain, up to 50% of baseline among the articles reviewed, is a prominent side effect of perfusion. Another technique, cryopreservation, displayed 90% graft failure by venous thrombosis, despite high free-graft viability following two weeks of storage.
CONCLUSIONS: This review of pre-treatment modalities found a variety of encouraging preservation techniques for grafts with high levels of tissue diversity. Ex vivo perfusion dominated PT innovation, with promising results in preserving the viability and functionality of muscle, which is central to the restoration of movement. Future studies are necessary to evaluate long-term graft outcomes and to optimize PT protocols for extended preservation times to ensure clinical relevance.
Giarraputo, A.; COUTANCE, G.; Patel, J. K.; Fedrigo, M.; Aubert, O.; Dagobert, J.; Mezine, F.; Robin, B.; Rouvier, P.; Varnous, S.; Duong Van Huyen, J.-P.; Bruneval, P.; Angelini, A.; Kobashigawa, J.; Loupy, A.
Background and aims: Tissue gene expression profiling has the potential to refine the diagnosis of cardiac allograft rejection. In contrast to whole-transcriptome approaches, targeted molecular profiling applicable to formalin-fixed paraffin-embedded (FFPE) endomyocardial biopsies (EMB) can be easily implemented in clinical practice. We aimed to develop and validate the first rejection molecular diagnostic system dedicated to heart transplantation (HTx).
Methods: An international multicenter study was designed, building a deeply phenotyped cohort of HTx recipients recruited between 2011 and 2021 at 4 referral centers. Detailed donor, recipient, clinical, immunological, biological, and histological parameters were collected. EMBs were graded according to international working formulations. Tissue gene expression was analyzed on FFPE EMBs using the consensus Banff Human Organ Transplant gene set. Molecular classifiers of antibody-mediated rejection (AMR) and acute cellular rejection (ACR) were built. Discrimination and calibration were assessed in the development and validation sets (NCT06436027).
Results: A total of 591 biopsies were included: 188 AMR (pAMR1(I+): n=51; pAMR1(H+): n=58; pAMR2-3: n=79), 289 ACR (1R: n=174; 2-3R: n=115), and 114 matched non-rejection cases. Biopsies were split into a derivation set (n=476) and a validation set (n=115). The top AMR transcripts were related to the IFN-gamma-inducible pathway, endothelial activation, and monocyte-macrophage recruitment. ACR was characterized by transcripts related to the T-cell receptor, CD3 receptor activation, and CD28 signaling. The ACR and AMR molecular rejection models were strongly associated with the pathological severity of rejection and accurately identified rejection in the derivation (ROC-AUC: AMR=0.831, ACR=0.837) and validation sets (ROC-AUC: AMR=0.812, ACR=0.849). Calibration was adequate. The robustness of the molecular classifiers was reinforced by various sensitivity analyses. An automated report was developed to enhance the reproducibility and clinical applicability of the molecular analysis.
Conclusions: In this study, the first tissue-based rejection molecular diagnostic system applicable to FFPE EMBs and dedicated to heart transplantation rejection was developed and internally validated. This tool has the potential to refine the diagnosis of rejection.
Lu, H.; Jiang, J.; Huang, X.; Haig, A.; Gunaratnam, L.; Jevnikar, A.; Zhang, Z.-X.
Background: PANoptosis is an integrated form of cell death that combines features of pyroptosis, apoptosis, and necroptosis and is regulated by a complex network of signaling proteins. The roles of ADAR1 (adenosine deaminase acting on RNA 1) and RIPK1 (receptor-interacting serine/threonine-protein kinase 1) in orchestrating the ZBP1 (Z-DNA binding protein 1)-RIPK3 complex to mediate PANoptosis are not fully understood, particularly in the context of heart transplantation.
Objective: This study investigated how ADAR1 and RIPK1 coordinate the activation of the ZBP1-RIPK3 complex to mediate PANoptosis and its implications in mouse heart transplantation.
Methods: Using both in vitro and in vivo models, we analyzed the interactions between ADAR1, RIPK1, ZBP1, and RIPK3. We employed western blotting and siRNA to elucidate the dynamics of these interactions. Additionally, we assessed the impact of ZBP1 on mouse heart transplantation outcomes.
Results: Our studies revealed that ADAR1 regulates the activation of the ZBP1-RIPK3 complex for PANoptosis. The interaction of ADAR1 with ZBP1 protected against Z-DNA-induced cell death by limiting activation of ZBP1 and RIPK3. In the mouse heart transplantation study, we found that ZBP1 and its ligands Z-DNA/Z-RNA were significantly increased in the graft post-transplantation. Furthermore, ZBP1 deficiency in the heart graft inhibited cardiac PANoptosis, attenuated acute graft injury, and induced long-term graft survival.
Conclusion: This study elucidates the role of ADAR1 in ZBP1-mediated PANoptosis. Inhibition of ZBP1 can prevent heart graft injury and rejection. Understanding these mechanisms provides valuable insights into the regulation of cell death and may inform the development of novel therapeutic strategies to improve transplant outcomes.
Tavolacci, S. C.; Okumura, K.; Isath, A.; Rodriguez, G.; De La Pena, C. B.; Shimamura, J.; Lansman, S. L.; Ohira, S.
Objective: Heart transplants utilizing allografts from donation after circulatory death (DCD) are growing rapidly, with the potential to expand the donor pool. However, little is known about the use of DCD donors for simultaneous heart and kidney transplants (SHKT) compared to SHKT using donation after brain death (DBD) donors.
Methods: From May 22, 2020, to September 30, 2023, 1,129 adult patients received SHKT (DCD, N=91 vs. DBD, N=1,038), identified using the United Network for Organ Sharing database, excluding other multi-organ transplants and re-transplants. Propensity score matching at a 1:3 ratio was performed using 17 recipient characteristics and 7 donor characteristics; 91 DCD and 273 DBD matched cases were compared.
Results: In the unmatched cohort, DCD recipients were older (DCD: 60 vs. DBD: 58 years, p=0.03) and had lower rates of dialysis at transplant (27% vs. 40%, p=0.03) and of status 1-2 listing (43% vs. 72%, p<0.001). Donors were younger (30 vs. 32 years, p=0.02) in the DCD group. In the matched cohort, kidney delayed graft function (27% vs. 22%, p=0.29) was comparable, as were recipient survival (p=0.19), heart graft survival (p=0.19), and kidney graft survival (p=0.17). In multivariate Cox proportional hazards analysis, donor type (DCD) was not associated with an increased risk of mortality (HR=1.69, 95% CI 0.90-3.16, p=0.10). Sub-group analysis showed that survival and freedom from graft failure were comparable between different modes of DCD recovery. Centers performing both DCD- and DBD-SHKT showed significantly shorter waitlist times with comparable transplant outcomes compared to centers performing only DBD-SHKT.
Conclusions: SHKT using DCD donors yields survival and graft outcomes comparable to those using DBD donors. These findings will guide treatment strategies for heart transplant candidates with kidney dysfunction, including the selection of donors and patients and safety-net policy options.
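The 1:3 propensity-score matching described above can be sketched in a few lines. Everything below (the greedy nearest-neighbor strategy, the IDs, the scores) is an illustrative assumption, not the authors' actual UNOS analysis, which fit propensity scores from 17 recipient and 7 donor characteristics:

```python
# Illustrative sketch of 1:3 nearest-neighbor propensity-score matching,
# of the kind used conceptually in the DCD vs. DBD SHKT comparison.
# Propensity scores here are hypothetical precomputed values.

def match_1_to_3(treated, controls):
    """Greedy 1:3 matching on precomputed propensity scores.

    treated, controls: lists of (patient_id, propensity_score) tuples.
    Returns {treated_id: [matched control ids]}; each control is used once.
    """
    available = dict(controls)          # id -> score, unmatched controls
    matches = {}
    for pid, score in treated:
        # pick the 3 unmatched controls closest in propensity score
        nearest = sorted(available, key=lambda c: abs(available[c] - score))[:3]
        matches[pid] = nearest
        for c in nearest:
            del available[c]
    return matches

# Toy example with made-up scores (2 DCD recipients, 7 DBD candidates)
dcd = [("D1", 0.30), ("D2", 0.55)]
dbd = [("B1", 0.28), ("B2", 0.33), ("B3", 0.31), ("B4", 0.50),
       ("B5", 0.66), ("B6", 0.57), ("B7", 0.90)]
print(match_1_to_3(dcd, dbd))
```

A real analysis would also check covariate balance after matching (e.g., standardized mean differences) before comparing outcomes.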
Kim, D. D.; Madabhushi, A.; Margulies, K. B.; Peyster, E. G.
Background: Cardiac allograft rejection (CAR) remains the leading cause of early graft failure after heart transplantation (HT). Current diagnostics, including histologic grading of endomyocardial biopsy (EMB) and blood-based assays, lack accurate predictive power for future CAR risk. We developed a predictive model integrating routine clinical data with quantitative morphologic features extracted from routine EMBs to demonstrate the precision-medicine potential of mining existing data sources in post-HT care.
Methods: In a retrospective cohort of 484 HT recipients with 1,188 EMB encounters within 6 months post-transplant, we extracted 370 quantitative pathology features describing lymphocyte infiltration and stromal architecture from digitized H&E-stained slides. Longitudinal clinical data comprising 268 variables, including lab values, immunosuppression records, and prior rejection history, were aggregated per patient. Using the XGBoost algorithm with rigorous cross-validation, we compared models based on four different data sources: clinical-only, morphology-only, cross-sectional-only, and fully integrated longitudinal data. The top predictors informed the derivation of a simplified Integrated Rejection Risk Index (IRRI), which relies on just 4 clinical and 4 morphology risk factors. Model performance was evaluated by AUROC, AUPRC, and time-to-event hazard ratios.
Results: The fully integrated longitudinal model achieved superior predictive accuracy (AUROC 0.86, AUPRC 0.74). IRRI stratified patients into risk categories with distinct future CAR hazards: high-risk patients showed a markedly increased CAR risk (HR=6.15, 95% CI 4.17-9.09), while low-risk patients had significantly reduced risk (HR=0.52, 95% CI 0.33-0.84). This performance exceeded models based on cross-sectional or single-domain data alone, demonstrating the value of multi-modal, temporal data integration.
Conclusions: By integrating longitudinal clinical and biopsy morphologic features, IRRI provides a scalable, interpretable tool for proactive CAR risk assessment. This precision-based approach could support risk-adaptive surveillance and immunosuppression-management strategies, offering a promising pathway toward safer, more personalized post-HT care with the potential to reduce unnecessary procedures and improve outcomes.
Clinical Perspective
What is new?
- Current tools for cardiac allograft monitoring detect rejection only after it occurs and are not designed to forecast future risk. This leads to missed opportunities for early intervention, avoidable patient injury, unnecessary testing, and inefficiencies in care.
- We developed a machine learning-based risk index that integrates clinical features, quantitative biopsy morphology, and longitudinal temporal trends to create a robust predictive framework.
- The Integrated Rejection Risk Index (IRRI) provides highly accurate prediction of future allograft rejection, identifying both high- and low-risk patients up to 90 days in advance, a capability absent from current transplant management.
What are the clinical implications?
- Integrating quantitative histopathology with clinical data provides a more precise, individualized estimate of rejection risk in heart transplant recipients.
- This framework has the potential to guide post-transplant surveillance intensity, immunosuppressive management, and patient counseling.
- Automated biopsy analysis could be incorporated into digital pathology workflows, enabling scalable, multicenter application in real-world transplant care.
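A simplified additive index of the IRRI's general shape (4 clinical plus 4 morphology risk factors feeding a tiered score) can be sketched as below. The factor names, weights, and cutoffs are all invented for illustration; the abstract does not report the real ones:

```python
# Hypothetical sketch of a simplified additive risk index in the spirit of
# the IRRI: each risk factor present contributes a weight, and thresholds
# split patients into surveillance tiers. All values below are invented.

RISK_WEIGHTS = {
    # clinical factors (hypothetical)
    "prior_rejection": 2.0,
    "low_immunosuppression_level": 1.5,
    "early_post_transplant": 1.0,
    "abnormal_labs": 1.0,
    # morphology factors (hypothetical)
    "high_lymphocyte_density": 2.0,
    "perivascular_clustering": 1.5,
    "stromal_disarray": 1.0,
    "elevated_edema_score": 1.0,
}

def risk_index(present_factors):
    """Sum the weights of the factors present for one biopsy encounter."""
    return sum(RISK_WEIGHTS[f] for f in present_factors)

def risk_tier(score, low_cut=2.0, high_cut=5.0):
    """Map a raw index to a risk tier (cutoffs are illustrative)."""
    if score >= high_cut:
        return "high"
    return "low" if score < low_cut else "intermediate"

score = risk_index(["prior_rejection", "high_lymphocyte_density",
                    "perivascular_clustering"])
print(score, risk_tier(score))  # 5.5 high
```

The appeal of such an index, as the abstract argues, is interpretability: a clinician can see exactly which factors drove a patient into the high-risk tier.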
Li, S.; Bhattacharya, R.; Elsenousi, A. E.; Nordick, K. V.; Hassan, A. M.; Peer, S. B.; Hochman-Mendez, C.; Rosengart, T. K.; Liao, K. K.; Mondal, N. K.
This study compares myocardial injury responses in human donor hearts from donation after brain death (DBD) and donation after circulatory death (DCD), with a focus on myocardial membrane integrity, pyroptosis, and damage. Unlike DCD hearts, which are exposed to varying durations of functional warm ischemic time (fWIT), DBD hearts, never subjected to warm ischemia, served as controls. A total of twenty-four human hearts were procured: six from the DBD group and eighteen from the DCD group. All procured hearts were placed in cold normal saline and stored for up to six hours. Left ventricular biopsies were performed at 0, 2, 4, and 6 hours to assess plasma-membrane repair proteins (Annexin A1, Dysferlin) and pyroptosis markers (NLRP3, caspase-1, GSDMD-NT), and to evaluate edema and injury scores. The data suggest that DBD hearts maintained stable levels of plasma-membrane repair proteins and showed no evidence of pyroptosis activation or significant injury throughout cold storage. In contrast, DCD hearts exhibited profound Annexin A1 depletion, early and progressive pyroptosis, elevated edema, and worsening histopathological injury, directly correlated with fWIT. These findings underscore that warm ischemia is a critical determinant of pyroptotic damage in donor hearts and highlight the relative resistance of DBD hearts to such injury during preservation. For DCD hearts, strategies to enhance membrane-repair capacity and inhibit pyroptosis should focus on the fWIT phase when assessing donor heart quality and suitability for transplantation.
New & Noteworthy: This study demonstrates that donor hearts procured after circulatory death (DCD) exhibit early Annexin A1 depletion and activation of the NLRP3/caspase-1/GSDMD-mediated pyroptosis pathway during cold storage, a phenomenon absent in brain-dead (DBD) donors. We establish a direct correlation between warm ischemia time and pyroptotic damage in DCD hearts. These findings identify Annexin A1 as a key mediator of ischemic injury and a promising therapeutic target to improve viability in marginal donor hearts.
O'Connor, M. J.; Vu, C.; Zhang, X.; Bennett, L.; Ahmed, H.; Edwards, J. J.; Lin, K. Y.; Li, Y.; Maeda, K.; Marcellus, B.; Monos, D.; Rossano, J. W.; Wittlieb-Weber, C.; Edelson, J. B.
Background: Allosensitization in pediatric heart transplantation (HT) is a challenging problem, with ongoing uncertainty as to the optimal management strategy. Patients with congenital heart disease (CHD) have the highest risk of allosensitization and may be at risk for inferior outcomes following HT due to an accumulation of risk factors.
Methods: The United Network for Organ Sharing database was queried for all patients <18 years of age with CHD undergoing HT between April 2015 and December 2020. Patients were grouped into three categories of allosensitization status based on the calculated panel reactive antibody (cPRA) obtained closest to the time of HT: nonsensitized (cPRA <10%), moderately sensitized (cPRA 10% to <80%), and highly sensitized (cPRA ≥80%). The primary outcome measures were one-year patient and graft survival following HT. Multivariable analysis was used to control for differences in preoperative clinical characteristics among sensitization categories.
Results: During the study period, 1086 patients with CHD underwent HT at a median age of 3 years. Nonsensitized patients comprised 70% of the cohort; 22% were moderately sensitized and 9% were highly sensitized. Unadjusted 1-year mortality was 25% in the highly sensitized group compared to 8.7% in the nonsensitized group (P<0.001). After adjustment, highly sensitized patients were more than 3 times as likely to die within the first year as nonsensitized patients (HR 3.44, 95% CI 2.13-5.54, P<0.001). The relationship between cPRA and crossmatch result was also assessed using multivariable regression. A variety of crossmatches were performed, including cytotoxicity and flow cytometry modalities. Regardless of crossmatch result, highly sensitized patients had an increased risk of one-year mortality and graft failure compared to nonsensitized and moderately sensitized patients (HR 3.4, 95% CI 1.98-5.84, P<0.001 for one-year mortality and HR 3.32, 95% CI 1.94-5.67, P<0.001 for the composite of death or graft failure).
Conclusions: Highly sensitized patients with CHD undergoing HT in the current era experience 25% 1-year mortality, significantly worse than less sensitized or nonsensitized patients. The magnitude of sensitization, as reflected by cPRA, is highly predictive of adverse outcomes. These at-risk patients remain in need of more effective therapies for desensitization and for managing the consequences of anti-HLA antibodies following HT.
Clinical Perspective
What is new?
- Allosensitization to HLA antigens is a common problem in pediatric heart transplantation, and outcomes remain suboptimal in allosensitized patients undergoing heart transplantation. Patients with CHD are at the highest risk of allosensitization.
- In the current study, highly sensitized children with CHD undergoing heart transplantation in the current era experienced 25% 1-year mortality, significantly higher than in other groups undergoing transplantation.
- Allosensitization status, regardless of crossmatch result, independently predicted mortality following heart transplantation in this cohort.
What are the clinical implications?
- Highly sensitized patients with CHD are much more likely to die in the first year following heart transplantation than less-sensitized or nonsensitized patients. They also experience higher rates of rejection, which contributes to morbidity and late mortality.
- Many efforts are made to minimize the likelihood of a positive crossmatch at the time of transplantation in order to optimize outcomes. However, the results of this study indicate that allosensitization status is the primary driver of outcomes when both allosensitization status and crossmatch result are taken into account. Therefore, continued development of new therapies for desensitization is warranted.
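The cPRA cutoffs in this abstract define a simple categorization rule. A minimal helper, assuming only the thresholds stated above (<10%, 10% to <80%, ≥80%):

```python
# Sensitization categories from the study's stated cPRA cutoffs:
# nonsensitized (<10%), moderately sensitized (10% to <80%),
# highly sensitized (>=80%). The function itself is an illustrative helper.

def sensitization_category(cpra_percent):
    if not 0 <= cpra_percent <= 100:
        raise ValueError("cPRA must be between 0 and 100")
    if cpra_percent < 10:
        return "nonsensitized"
    if cpra_percent < 80:
        return "moderately sensitized"
    return "highly sensitized"

for cpra in (0, 9.9, 10, 79.9, 80, 100):
    print(cpra, sensitization_category(cpra))
```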
Perez Guerrero, A.; Vilchez-Tschischke, J. P.; Almenar Bonet, L.; Diez Gil, J. L.; Blasco Peiro, T.; Brugaletta, S.; Gomez Lara, J.; Gonzalez Costello, J.; Antuna, P.; Alonso Fernandez, V.; Sarnago Cebada, F.; Garcia-Cosio, M. D.; Hidalgo Lesmes, F.; Lopez Granados, A.; Lopez-Palop, R.; Paula Garrido, I.; Cardenal Piris, R. M.; Rangel Sousa, D.; Fuertes Ferre, G.
Background: Acute allograft rejection (AAR) is an important cause of morbidity and mortality in heart transplant (HT) patients, particularly during the first year. Endomyocardial biopsy (EMB) is the gold standard to guide post-heart-transplantation treatment. However, it is associated with complications that can be potentially serious. The index of microvascular resistance (IMR) is a specific physiological parameter for measuring microvascular function. An increased IMR measured early after HT has been associated with acute cellular rejection (ACR), higher all-cause mortality, and adverse cardiac events. To our knowledge, no study has evaluated the impact of IMR on post-HT management (the number of EMBs performed). Our aim is to assess whether post-HT patient management may be modified based on IMR values.
Study design: The IMR-HT study (NCT06656065) is a multicenter, prospective study that will include consecutive stable post-HT patients undergoing coronary physiological assessment in the first three months and at one year. Depending on IMR values, the physician will be able to reduce the number of biopsies established in each center's protocol.
Conclusions: Management after heart transplant (the number of biopsies) could be modified depending on IMR values.
Nord, D.; Brunson, J. C.; Langerude, L.; Moussa, H.; Gill, B.; Machuca, T.; Rackauskas, M.; Sharma, A. K.; Lin, C.; Emtiazjoo, A.; Atkinson, C.
BACKGROUND: There is an urgent need to better understand the pathophysiology of primary graft dysfunction (PGD) so that point-of-care methods can be developed to predict those at risk. Here we utilize a multiplex, multivariable approach to profile cytokines, chemokines, and growth factors in patient-matched biospecimens from multiple biological sites to identify factors predictive of PGD.
METHODS: Biospecimens were collected from patients undergoing bilateral lung transplantation (LTx) from three distinct sites: donor lung perfusate, post-transplant bronchoalveolar lavage (BAL) fluid (2h), and plasma (2h and 24h). A 71-analyte multiplex panel was performed on each biospecimen. Cross-validated logistic regression (LR) and random forest (RF) machine learning models were used to determine whether analytes from each site, or from combinations of sites, with or without clinical data, could discriminate between PGD grade 0 (n=9) and grade 3 (n=8).
RESULTS: By optimal AUROC, BAL fluid at 2h was the most predictive of PGD (LR, 0.825; RF, 0.919), followed by multi-timepoint plasma (LR, 0.841; RF, 0.653), then perfusate (LR, 0.565; RF, 0.448). Combined clinical, BAL, and plasma data yielded the strongest performance (LR, 1.000; RF, 1.000). Using a LASSO on the predictors obtained with LR, we selected IL-1RA, BCA-1, and Fractalkine as most predictive of severe PGD.
CONCLUSIONS: BAL samples collected 2h post-transplant were the strongest predictors of severe PGD. Our machine learning approach not only identified novel cytokines not previously associated with PGD, but also identified analytes that could be used in a point-of-care cytokine panel aimed at identifying those at risk of developing severe PGD.
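The AUROC values reported above compare how well each model's scores separate PGD grade 3 from grade 0 cases. A minimal rank-based (Mann-Whitney) AUROC computation is sketched below; the scores are made up for illustration and are not the study's data:

```python
# Minimal AUROC computation of the kind used to compare the logistic
# regression and random forest models (PGD grade 0 vs. grade 3).
# Pure-Python Mann-Whitney formulation; scores below are invented.

def auroc(labels, scores):
    """AUROC = P(score of a random positive > score of a random negative),
    counting ties as 1/2. labels: 0/1 outcomes, scores: model outputs."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: 3 severe-PGD cases (label 1) vs. 3 grade-0 cases (label 0)
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.7, 0.4, 0.5, 0.3, 0.1]
print(auroc(labels, scores))  # one misranked pair out of 9 -> 8/9
```

With only 17 patients, as in this study, cross-validation and cautious interpretation of a perfect 1.000 AUROC are essential; the authors' cross-validated design addresses exactly this.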
Moroi, M.; Kosuri, Y.; Karcher, C.; Campbell, A.; Adamo, A.; Albino, D.; Batik, E.; Chan, C.; Fung, K.; Sekilic, M.; Faruqi, S. K.; Goergen, C. J.; Tamimi, M.; Takeda, K.; Ferrari, G.
Objectives: Donation after circulatory death (DCD) expands the donor heart pool but is limited by warm ischemia and short preservation times. Hypothermic oxygenated perfusion (HOPE) extends storage beyond the 4-6-hour limit of static cold storage (SCS), yet the cellular and molecular responses remain undefined. We evaluate cardiomyocyte integrity and functional recovery of DCD porcine hearts after in situ reanimation with normothermic regional perfusion (NRP) and preservation by SCS (2h) or HOPE (24h), and assess the impact of NRP under DCD conditions.
Methods: Nine Yorkshire pigs underwent DCD cardiectomy. Six animals experienced 15 min of warm ischemia followed by 60 min of NRP; three of these hearts were preserved for 2h with SCS and three for 24h with HOPE. A second DCD group (n=3) underwent direct procurement without NRP and 2h of HOPE preservation. All hearts were reanimated by normothermic machine perfusion to assess rhythm and contractility. Cardiomyocyte viability, transcriptomics, and metabolomics were analyzed.
Results: After DCD+NRP, 2h SCS preserved intact cardiomyocyte viability. HOPE maintained measurable, though reduced, viability at 24h, while 24h SCS failed even under donation after brain death (DBD) conditions. Transcriptomic and metabolomic analyses showed marginal differences between 2h SCS and 24h HOPE. All 24h HOPE hearts regained sinus rhythm but showed reduced contractility versus 2h SCS. Without NRP, 2h HOPE hearts showed the lowest viability and contractility.
Conclusions: HOPE supports extended preservation of DCD hearts, but viability and function decline by 24h. NRP is essential for functional recovery of hearts preserved at hypothermic temperatures in a porcine preclinical DCD model.
Graphical Abstract (Figure 1)
Movahed, M. R.; Namazi, M. J.; Rezasoltani, M.; Hashemzadeh, M.
Background: Cardiac allograft vasculopathy (CAV) is a significant cause of late transplant failure. Using a large database, the study's objective was to assess traditional and infectious risk factors linked to the occurrence and severity of CAV.
Methods: Using the National Inpatient Sample (NIS) database, we evaluated associations between CAV and traditional risk factors and infectious viral agents. Additionally, we assessed the severity of CAV based on the occurrence of revascularizations.
Results: A total of 78,330 heart transplant recipients were identified. CAV was diagnosed in 1,015 patients overall. Patients with CAV had a higher mortality rate (4.4% vs 2.1%, OR 2.09, CI 1.08-4.03, p=0.03). Known traditional risk factors and baseline characteristics, including gender, race, hypertension, hyperlipidemia, diabetes mellitus, and smoking, were not linked to the presence of CAV, except for younger age (mean age 56 vs 59 years). Furthermore, a history of infectious mononucleosis strongly correlated with CAV (OR 8.9, CI 2.68-29.6, p<0.001). Younger age not only increased the likelihood of developing CAV but also increased the probability of undergoing coronary bypass surgery after heart transplant. Influenza and other viral infections, such as Cytomegalovirus, did not correlate with the presence of CAV.
Conclusion: Younger age was associated with CAV, but no other traditional risk factors were. Infectious mononucleosis, the only infectious agent correlating with CAV, had a very high association with CAV, warranting further investigation.
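The odds ratios with confidence intervals reported above (e.g., mortality OR 2.09, CI 1.08-4.03) come from 2x2 comparisons of the kind sketched below. The counts in the example are invented; the abstract does not report the underlying NIS cell counts:

```python
# Sketch of an odds ratio with a 95% Wald confidence interval, the kind
# of statistic reported for CAV vs. mortality. Counts below are invented.

import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """2x2 table: a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)      # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for a CAV vs. no-CAV mortality comparison
or_, lo, hi = odds_ratio_ci(45, 970, 1600, 74000)
print(f"OR={or_:.2f} 95% CI {lo:.2f}-{hi:.2f}")
```

Note that with rare outcomes the CI can be wide even in a large database, which is consistent with the broad intervals the abstract reports.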
Camillo, C.; Moroi, M.; Kosuri, Y.; Campbell, A.; Adamo, A.; Patel, K.; Karcher, C.; Bauer, S.; Albino, D.; Batik, E.; Peng, T.; Pei, L.; Chan, C.; Fung, K.; Sekilic, M.; Nandakumar, R.; Faridmoayer, E.; Kho, C.; Bernardi, B.; Romanov, A.; Tamimi, M.; Grau, J.; Takeda, K.; Ferrari, G.
Background: Ex vivo oxygenated perfusion is a promising approach to extend cardiac allograft preservation beyond the typical 4-6h limit allowed by static cold storage (SCS). Hypothermic oxygenated perfusion (HOPE) has been shown to safely preserve donor hearts, yet its underlying molecular mechanisms have not been extensively evaluated.
Objectives: The aim of the study is to characterize cardiomyocyte viability, transcriptomic and metabolomic responses, and functional recovery of porcine hearts preserved with HOPE for up to 48h, including their ability to regain sinus rhythm following bench-top normothermic machine perfusion (NMP).
Methods: Seventeen Yorkshire pigs underwent donor cardiectomy. In the first arm, ten hearts were preserved for up to 48h using either SCS (n=5) or HOPE (n=5). Endomyocardial biopsies were collected at 0, 12, 24, and 48h for histology, RNA sequencing, flow cytometry, and metabolomics. In the second arm, six HOPE-preserved hearts (3h, 24h, 48h) and one SCS-preserved heart (24h) underwent 2h of NMP to simulate transplantation and assess reanimation.
Results: HOPE preserved cardiomyocyte viability and structural integrity for 48h, in contrast to SCS, in both arms of the study. RNA sequencing and untargeted metabolomics revealed conserved energy-substrate profiles with HOPE and progressive ischemic metabolite accumulation with SCS. All HOPE hearts regained stable sinus rhythm.
Conclusions: HOPE enables 48h ex vivo heart preservation while maintaining cardiomyocyte integrity, normal gross and microscopic architecture, and rapid functional recovery on bench-top reperfusion in a preclinical model. These findings establish a foundation for redefining clinical preservation times and widening geographic donor access.
GRAPHICAL ABSTRACT (Figure 1)
Hullin, R.; Pitta Gros, B.; Rocca, A.; Laptseva, N.; Martinelli, M. V.; Flammer, A. J.; Lu, H.; Meyer, P.; Leuenberger, N.; Mueller, M.
Background: Iron metabolism disorders are highly prevalent before and after heart transplantation (HTx). The impact of pretransplant and posttransplant iron disorders on posttransplant outcomes is unclear.
Objective: Pretransplant serum levels of key regulator proteins of iron metabolism (hepcidin, interleukin-6, erythroferrone) were tested for prediction of the composite outcome of 1-year posttransplant all-cause mortality (ACM) or ≥moderate acute cellular rejection (ACR). Furthermore, serum levels of these proteins were measured at 1 year posttransplant to explore their posttransplant course and association with ACR.
Results: In a multicenter cohort of 276 consecutive HTx recipients, patients with or without the outcome (n=118/158, respectively) did not differ in pretransplant demographics, donor/recipient sex mismatch, HLA epitope mismatch, or hepcidin or interleukin-6 levels. However, pretransplant erythroferrone levels were higher (1.40 vs. 1.19 ng/mL; p=0.013) and hemoglobin levels were lower (124.5 vs. 127 g/L; p=0.004) among patients with the composite outcome. Pretransplant erythroferrone levels >2.25 ng/mL (4th quartile) were significantly associated with the composite outcome in multivariable analysis (OR 2.17; 95% CI 1.19-3.94, p=0.011; reference: 1st-3rd quartiles). In adjusted predicted-proportions analysis, the incidence of the composite outcome was higher in 4th-quartile patients than in 1st-3rd-quartile patients (58.0 vs. 37.7%; p=0.003). At 1 year posttransplant, levels remained high in 80.4% of patients with pretransplant erythroferrone levels >2.25 ng/mL; 88.4% of patients with pretransplant erythroferrone levels ≤2.25 ng/mL had high levels posttransplant. In 1-year survivors with high erythroferrone levels and ≥moderate ACR during the first postoperative year, the ratio of the opposing regulators of hepcidin gene expression, erythroferrone to interleukin-6, was higher than in those without ACR (1.18 vs. 0.41; p=0.016). Hepcidin levels did not differ between these two subgroups, indicating a disproportionate erythroferrone increase.
Conclusion: High pretransplant erythroferrone levels predict the composite posttransplant outcome of 1-year ACM or ≥moderate ACR. Disproportionately high posttransplant erythroferrone levels are associated with ≥moderate acute cellular rejection.
Arike, L.; Johansson, K.; Ermund, A.; Greer, M.; Pelaseyed, T.; Westin, J.; Hansson, G. C.; Magnusson, J. M.
Introduction: Freedom from chronic lung allograft dysfunction (CLAD) is a key objective after lung transplantation, yet predicting its onset remains challenging. This study investigated whether early proteomic changes in bronchoalveolar lavage fluid (BALF) can differentiate between patients maintaining stable graft function at 36 months and those developing CLAD within the first year. Additionally, findings were compared with proteomic data from non-transplanted individuals. Methods: BALF samples were collected at one and twelve months post-transplant from 43 lung transplant recipients, together with clinical parameters. Proteomic analysis was performed using mass spectrometry with label-free quantification for global protein profiling and heavy-labelled peptides for absolute quantification of mucins and related proteins. Differentially expressed proteins were identified and analyzed through pathway enrichment to explore biological mechanisms associated with CLAD. Results: No significant proteomic differences were detected at one month. By twelve months, 63 proteins were differentially expressed between patients who developed early CLAD and those with stable function. Mucin levels declined in stable patients but remained elevated in both groups compared with healthy controls. Cartilage acidic protein 1 was significantly higher in stable patients at twelve months and correlated with better pulmonary function. Pathway analysis linked several altered proteins in CLAD patients to networks associated with lung injury and remodelling. Conclusion: Protein profiles in BALF that resemble those of healthy lungs are associated with sustained graft function, while persistent expression of lung injury markers is associated with early CLAD. This suggests an adaptive process is needed for long-term post-transplant success.
Gispert Martinez, M.; Chorda Sanchez, M.; Rosello Castells, O.; Ruiz Arranz, A.; Castillo Garcia, J.
Objective: To analyze six years of experience with ECMO in uncontrolled donation after circulatory death (uDCD), assessing the clinical and logistical factors that determine donation effectiveness and the viability of retrieved organs, with the nurse perfusionist as the central figure in organ perfusion. Methods: Retrospective observational study of uDCD procedures performed at Hospital Clinic de Barcelona between June 2019 and October 2025. Results: Of 184 out-of-hospital ECMO-CPR activations, 108 (58.7%) underwent perfusion; 72 donor cases (66.7%) were generated, and 109 kidneys (75.7%) and 3 livers (4.15%) were retrieved. The annual number of uDCD donors was heterogeneous. Compared with non-effective donors, effective donors were significantly younger (48.1 ± 12.4 vs. 53.0 ± 10.7 years, p=0.03) and had fewer comorbidities such as hypertension (13.8% vs. 33.0%, p=0.018) and diabetes (4.1% vs. 16.6%, p=0.027). Although effective donors had a shorter cannulation time (25.6 ± 13.9 vs. 29.1 ± 11.9 min, p=0.09), the difference was not statistically significant; however, cardiocompressor time did show a significant difference (58.9 ± 17.7 vs. 65.8 ± 18.2 min, p=0.03). Conclusions: uDCD was a useful source of transplantable organs, mainly kidneys (two out of every three perfused patients became donors), in the current context of scarcity of brain-dead donors. Shorter warm ischemia times (cardiocompressor and cannulation times) were significantly associated with more effective organ donation. The multidisciplinary transplant team may benefit from perfusion professionals with expertise in extracorporeal oxygenation therapy.
Hussain, T.; Brahmbhatt, D. H.; Scolari, F. L.; Abelson, S.; Dick, J. E.; Billia, F.
Clonal hematopoiesis (CH) promotes inflammation and is associated with the development of cardiovascular disease. Previous studies assessing CH mutations in orthotopic heart transplant (OHT) recipients have revealed inconsistent findings, likely due to small sample sizes and differing sample collection times. In this study, we investigated the association between CH and post-transplant outcomes using a more consistent sample collection window. This retrospective study included 209 patients who underwent OHT between 2015 and 2022. Targeted sequencing detected CH mutations in samples obtained within six months before or after transplantation. Clinical data were collected from the electronic medical record. Patients undergoing OHT had a median age of 53 years, and 27% were female. CH-associated mutations with a variant allele frequency (VAF) greater than 2% were detected in 29 patients (13.9%). The most commonly mutated genes were DNMT3A, TET2, and ASXL1. CH mutations were associated with an increased risk of antibody-mediated rejection (AMR) (HR 2.42, 95% CI 1.07-5.47, p=0.033), but no differences were detected in mortality or cardiac allograft vasculopathy (CAV). CH mutations detected at the time of transplant were associated with clinically significant AMR. Sample analysis at the time of transplant provides the clearest association between CH mutations and outcomes in OHT.
Kim, P. J.; Contijoch, F. J.; Morris, G. P.; Wong, D.; Nguyen, P. K.
Background: We investigated differences in myocardial perfusion and immune cell response in heart-transplant patients with nonspecific graft dysfunction (NGD) compared with cardiac allograft vasculopathy (CAV) patients and normal heart-transplant patients. Methods and Results: We prospectively studied 17 heart-transplant patients (59.8 ± 14.1 years, 78% male) from January to June 2016. Regadenoson stress cardiac MRI was performed, and peripheral blood was obtained contemporaneously to isolate peripheral blood mononuclear cells (PBMCs). Stress myocardial perfusion, assessed by the maximum-upslope method, was significantly decreased in NGD and CAV patients compared with normal heart-transplant patients. Myocardial scar by late gadolinium enhancement was also significantly increased in NGD patients compared with normal. Evaluation of PBMCs by flow cytometry showed a trend toward increased activated HLA-DR+ T cells in NGD patients compared with normal. Clinical outcomes (cardiac hospitalization, allograft loss/retransplant, and death) were assessed at 8 years. Conclusions: NGD shows decreased stress myocardial perfusion on cardiac MRI and a trend toward increased activated T cells in PBMCs, suggestive of an immune-mediated cause of allograft dysfunction.
Jang, M. K.; Oluwayiose, O.; Redekar, N.; Andargie, T. E.; Park, W.; Alnababteh, M.; Hill, T.; Phipps, K.; Kong, H.; Tian, X.; Luikart, H.; Solomon, M. A.; Shah, P.; Valantine, H. A.; Khush, K. K.; Agbor-Enoh, S.
Background: Antibody-mediated rejection (AMR) remains the major risk factor for allograft loss across all solid organ transplantation. Unfortunately, its diagnosis relies on biopsy, an invasive gold standard that often samples unaffected allograft tissue, leading to missed diagnoses. Plasma donor-derived cell-free DNA (dd-cfDNA) is a noninvasive biomarker with high sensitivity but low specificity for AMR diagnosis. This proof-of-concept study assessed the utility of cell-free chromatin immunoprecipitation (cfChIP) as a surrogate for gene expression to detect cardiac AMR and its associated pathobiology. Methods: The discovery GRAfT multicenter cohort of heart transplant patients (NCT02423070) identified AMR, acute cellular rejection (ACR), and stable controls based on biopsy and dd-cfDNA results. Plasma cfChIP sequencing was performed to identify peaks, associated genes, and pathobiological pathways. Plasma from an external cohort (GTD, NCT01985412) was also analyzed to verify the identified pathways. Digital droplet PCR (ddPCR) assays targeting differential regions were constructed to test the diagnostic performance of cfDNA in distinguishing AMR/ACR from stable controls (rejection-specific assays) or AMR from ACR (AMR-specific assays). Results: The cohort included 21 AMR, 28 ACR, and 45 stable controls from GRAfT and GTD, and 23 healthy controls. cfChIP detected expected active genes, including housekeeping genes and gene targets of transplant immunosuppressive drugs, but not inactive genes. Unsupervised clustering of the discovery GRAfT cohort correctly assigned 95% of samples as AMR, ACR, or stable control. Differential analysis identified pathobiological pathways of AMR, such as neutrophil degranulation and complement activation. These pathways were consistent in GTD samples. Rejection-specific assays distinguished AMR/ACR from controls with AUCs of 0.78-0.95. AMR-specific assays distinguished AMR from ACR with AUCs of 0.71-0.85, sensitivities of 0.73-0.94, and specificities of 0.73-0.80. Conclusion: This study provides valuable preliminary data supporting the use of cfChIP to detect AMR and its associated pathobiological pathways.
Gavzy, S. J.; Kensiski, A.; Saxena, V.; Lakhan, R.; Hittle, L.; Iyyathurai, J.; Wu, L.; Dhakal, H.; Lee, Z. L.; Li, L.; Lee, Y.; Zhang, T.; Lwin, H. W.; Shirkey, M. W.; Paluskievicz, C.; Piao, W.; Mongodin, E. F.; Ma, B.; Bromberg, J. S.
Background: Despite ongoing improvements in regimens to prevent allograft rejection, most cardiac and other organ grafts eventually succumb to chronic vasculopathy, interstitial fibrosis, or endothelial changes, and ultimately graft failure. The events leading to chronic rejection are still poorly understood, and the gut microbiota is a known driving force in immune dysfunction. We previously showed that gut microbiota dysbiosis profoundly influences the outcome of vascularized cardiac allografts and subsequently identified biomarker species associated with these differential graft outcomes. Methods: In this study, we further detailed the multifaceted immunomodulatory properties of pro-tolerogenic and pro-inflammatory bacterial species over time, using our clinically relevant model of allogeneic heart transplantation. Results: In addition to tracing longitudinal changes in the recipient gut microbiome, we observed that Bifidobacterium pseudolongum (Bifido) induced an early anti-inflammatory phenotype within 7 days, while Desulfovibrio desulfuricans (Desulfo) resulted in a pro-inflammatory phenotype, defined by alterations in leukocyte distribution and lymph node (LN) structure. Indeed, in vitro results showed that Bifido and Desulfo acted directly on primary innate immune cells. However, by 40 days after treatment, these two bacterial strains were associated with mixed effects on LN architecture and immune cell composition, and with loss of colonization within the gut microbiota, despite protection of allografts from inflammation with Bifido treatment. Conclusions: These dynamic effects suggest a critical role for early microbiota-triggered immunological events, such as innate immune cell engagement, T cell differentiation, and LN architectural changes, in the subsequent modulation of pro-tolerant versus pro-inflammatory immune responses in organ transplant recipients.